Parallel layer perceptron

Authors

  • Walmir M. Caminhas
  • Douglas A. G. Vieira
  • João A. Vasconcelos
Abstract

In this paper, both the architecture and the learning procedure underlying the parallel layer perceptron are presented. This topology, unlike previous ones, uses parallel layers of perceptrons to map nonlinear input–output relationships. Comparisons between the parallel layer perceptron, the multi-layer perceptron and ANFIS are included and show the effectiveness of the proposed topology. © 2003 Elsevier B.V. All rights reserved.
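The abstract describes layers of perceptrons acting in parallel rather than stacked in sequence. As a minimal sketch of that idea, the toy below feeds the same input to two perceptron layers side by side and combines their outputs elementwise before aggregating; the exact combination rule is defined in the paper, and all weights and the multiplicative combination here are illustrative assumptions, not the authors' model.

```python
import numpy as np

def layer(x, W, b, act=np.tanh):
    """One perceptron layer: activation of an affine map."""
    return act(W @ x + b)

def plp(x, params):
    """Toy parallel-layer net: both layers see x directly (no stacking);
    their outputs are combined elementwise, then summed.
    The combination rule is an assumption for illustration."""
    (W1, b1), (W2, b2) = params
    h1 = layer(x, W1, b1)
    h2 = layer(x, W2, b2)
    return float(np.sum(h1 * h2))

# Made-up weights just to exercise the structure.
rng = np.random.default_rng(0)
n_in, n_hidden = 3, 4
params = [(rng.standard_normal((n_hidden, n_in)),
           rng.standard_normal(n_hidden)) for _ in range(2)]
y = plp(rng.standard_normal(n_in), params)
```

The point of the sketch is purely structural: unlike a multi-layer perceptron, neither layer consumes the other's output.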

Similar references

Kinematic Synthesis of Parallel Manipulator via Neural Network Approach

In this research, Artificial Neural Networks (ANNs) have been used as a powerful tool to solve the inverse kinematic equations of a parallel robot. For this purpose, we have developed the kinematic equations of a Tricept parallel kinematic mechanism with two rotational and one translational degrees of freedom (DoF). Using the analytical method, the inverse kinematic equations are solved for spe...


Practical Performance and Credit Assignment Efficiency of Analog Multi-layer Perceptron Perturbation Based Training Algorithms

Many algorithms have recently been reported for the training of analog multi-layer perceptrons. Most of these algorithms were evaluated from either a computational or a simulation viewpoint. This paper applies several of these algorithms to the training of an analog multi-layer perceptron chip. The advantages and shortcomings of these algorithms in terms of training and generalisation performance...


A Pmlp Based Method for Chaotic Time Series Prediction

This paper proposes a new method for prediction of chaotic time series based on Parallel Multi-Layer Perceptron (PMLP) net and dynamics reconstruction technique. The PMLP contains a number of multi-layer perceptron (MLP) subnets connected in parallel. Each MLP subnet predicts the future data independently with a different embedding dimension. The PMLP determines the final predicted result accor...
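The PMLP structure above can be sketched in a few lines: each subnet sees the last `d` values of the series for a different embedding dimension `d`, and the parallel predictions are combined. A plain average and a mean-of-window stand-in predictor are assumptions for illustration; the paper's subnets are trained MLPs and its combination rule may differ.

```python
import numpy as np

def subnet_predict(window):
    # Stand-in one-step predictor (a real PMLP uses a trained MLP subnet):
    # here, simply the mean of the delay window.
    return float(np.mean(window))

def pmlp_predict(series, dims=(2, 3, 4)):
    """Run one subnet per embedding dimension in parallel, then average."""
    preds = [subnet_predict(np.asarray(series[-d:], dtype=float))
             for d in dims]
    return float(np.mean(preds))

print(pmlp_predict([1, 2, 3, 4, 5]))
```

With the mean-of-window stand-in, the three subnets return 4.5, 4.0 and 3.5 for this series, so the combined prediction is 4.0.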


A learning rule for very simple universal approximators consisting of a single layer of perceptrons

One may argue that the simplest type of neural networks beyond a single perceptron is an array of several perceptrons in parallel. In spite of their simplicity, such circuits can compute any Boolean function if one views the majority of the binary perceptron outputs as the binary output of the parallel perceptron, and they are universal approximators for arbitrary continuous functions with valu...
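The majority-vote construction described above can be made concrete: with hand-picked weights (illustrative values, not from the paper), the majority of three binary perceptron outputs computes XOR, a function no single perceptron can represent.

```python
def step(z):
    """Binary threshold unit."""
    return 1 if z >= 0 else 0

# Three parallel perceptrons, each ((w1, w2), bias).
# Weights are hand-picked for illustration so the majority realises XOR.
UNITS = [
    ((1.0, 1.0), -1.0),   # fires on OR(x1, x2)
    ((-1.0, -1.0), 1.5),  # fires on NAND(x1, x2)
    ((1.0, -1.0), -0.5),  # fires only on (1, 0)
]

def parallel_perceptron(x):
    """Majority of the binary outputs of the parallel perceptrons."""
    votes = [step(w[0] * x[0] + w[1] * x[1] + b) for w, b in UNITS]
    return 1 if sum(votes) > len(votes) / 2 else 0

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, parallel_perceptron(x))
```

Each unit alone is linearly separable; only the majority vote over the parallel outputs yields the non-separable XOR.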


A two-layer paradigm capable of forming arbitrary decision regions in input space

It is well known that a two-layer perceptron network with threshold neurons is incapable of forming arbitrary decision regions in input space, while a three-layer perceptron has that capability. The effect of replacing the output neuron in a two-layer perceptron with a bithreshold element is studied. The limitations of this modified two-layer perceptron are observed. Results on the separating c...



Journal:
  • Neurocomputing

Volume 55, Issue -

Pages -

Publication date: 2003